Autonomous Visual Navigation for a Flower Pollination Drone
Authors
Abstract
In this paper, we present the development of a visual navigation capability for a small drone, enabling it to autonomously approach flowers. This is a very important step towards a fully autonomous flower-pollinating nanodrone. The developed drone operates totally autonomously and relies on its on-board color camera, complemented with one simple ToF distance sensor, to detect and approach the flower. The proposed solution uses a DJI Tello drone carrying a Maix Bit processing board capable of running all deep-learning-based image algorithms on-board. We propose a two-stage visual servoing algorithm that first uses a highly optimized object detection CNN to localize flowers and fly towards them. The second phase, approaching the flower, is implemented by a direct steering CNN. This enables the drone to detect any flower in its neighborhood, steer towards it, and make the drone’s rod touch the flower. We trained the deep learning models on an artificial dataset mixing images of real flowers, artificial (synthetic) flowers, and virtually rendered flowers. Our experiments demonstrate that the approach is technically feasible: the drone is able to detect, approach, and touch flowers autonomously. Our 10 cm sized prototype was trained on sunflowers, but the methodology presented in this paper can be retrained for any flower type.
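As a rough illustration of the two-stage servoing described in the abstract, the sketch below is a minimal Python control loop built on hypothetical stand-ins (`flower_detector`, `steering_cnn`, `read_tof`, `send_velocity`, `capture_frame`, and the distance thresholds) for the on-board models, ToF sensor, and flight interface; none of these names or values are taken from the paper.

```python
# Hypothetical sketch of a two-stage visual servoing loop: phase one centers a
# detected flower and flies toward it, phase two hands control to a steering CNN.
# All model, sensor and flight-control interfaces below are stand-ins.
import random

FRAME_W, FRAME_H = 320, 240      # camera resolution (assumed)
APPROACH_DIST_MM = 300           # switch to phase two below this ToF range (assumed)
TOUCH_DIST_MM = 50               # stop when the rod should be touching the flower (assumed)

def flower_detector(frame):
    """Stand-in for the optimized object detection CNN.
    Returns (cx, cy, confidence) of the best flower box, or None."""
    return (random.randint(0, FRAME_W), random.randint(0, FRAME_H), 0.9)

def steering_cnn(frame):
    """Stand-in for the direct steering CNN used in the approach phase.
    Returns (yaw_rate, vertical_rate, forward_rate) commands."""
    return (0.0, 0.0, 0.2)

def read_tof():
    """Stand-in for the ToF distance sensor, in millimetres."""
    return random.randint(40, 2000)

def send_velocity(forward, vertical, yaw):
    """Stand-in for the drone's velocity command interface."""
    print(f"cmd fwd={forward:+.2f} vert={vertical:+.2f} yaw={yaw:+.2f}")

def capture_frame():
    """Stand-in for grabbing a frame from the on-board color camera."""
    return None

def servo_step(kp=0.002):
    """One control step of the two-stage loop."""
    frame = capture_frame()
    distance = read_tof()

    if distance <= TOUCH_DIST_MM:
        send_velocity(0.0, 0.0, 0.0)       # contact assumed: hold position
        return "done"

    if distance > APPROACH_DIST_MM:
        det = flower_detector(frame)       # phase one: detect and center
        if det is None:
            send_velocity(0.0, 0.0, 0.3)   # no flower: slowly yaw to search
            return "searching"
        cx, cy, _ = det
        yaw = kp * (cx - FRAME_W / 2)      # proportional centering on image error
        vert = -kp * (cy - FRAME_H / 2)
        send_velocity(0.3, vert, yaw)      # fly toward the centered flower
        return "phase1"

    yaw, vert, fwd = steering_cnn(frame)   # phase two: direct CNN steering
    send_velocity(fwd, vert, yaw)
    return "phase2"

if __name__ == "__main__":
    for _ in range(5):
        print(servo_step())
```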
Similar Resources
Autonomous Drone
For our senior project we modified a quadcopter so that it can fly autonomously and land whenever a camera detects a red target. The quadcopter uses four infrared sensors to avoid obstacles and an off-board camera to detect targets. Whenever the camera sees a red target, it tells the quadcopter to land. Our project was mostly successful; we were not able to attach the camera to the quadcopter d...
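The snippet below is a minimal, hypothetical illustration of such a red-target trigger using OpenCV HSV thresholding on a synthetic frame; the threshold values and the area-based landing rule are assumptions, not details taken from that project.

```python
# Hypothetical illustration: detect a red target in a camera frame and decide
# to land. Thresholds and the area-fraction rule are assumptions.
import numpy as np
import cv2

def red_target_visible(frame_bgr, min_area_fraction=0.01):
    """Return True if enough red pixels are present to call it a landing target."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two hue ranges.
    lower = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    mask = cv2.bitwise_or(lower, upper)
    return mask.mean() / 255.0 >= min_area_fraction

if __name__ == "__main__":
    # Synthetic 200x200 frame with a red square in the middle (BGR order).
    frame = np.zeros((200, 200, 3), dtype=np.uint8)
    frame[80:120, 80:120] = (0, 0, 255)
    if red_target_visible(frame):
        print("red target detected -> send land command")
    else:
        print("no target -> keep flying")
```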
Visual curb localization for autonomous navigation
Percept–referenced commanding is an attractive paradigm for autonomous navigation over long distances. Rather than relying on precise prior environment maps and self–localization, this approach uses high–level primitives that refer to environmental features. The assumption is that the sensing and processing system onboard the robot should be able to extract such features reliably. In this conte...
Autonomous Visual Navigation for Planetary Exploration Rovers
SPARTAN (SPAring Robotics Technologies for Autonomous Navigation) and its extension SEXTANT (Spartan EXTension Activity Not Tendered) are two robotic exploration technology development activities funded by ESA. They target the development of computer vision algorithms for visual navigation that will be suitable for use by Martian rovers. This paper summarizes our on-going efforts in the context...
Globally Consistent Mosaicking for Autonomous Visual Navigation
Mobile robot localization from large-scale appearance mosaics has been showing increasing promise as a low-cost, high-performance and infrastructure-free solution to vehicle guidance in man-made environments. The feasibility of this technique relies on the construction of a highresolution mosaic of the vehicle’s environment. For reliable position estimation, the mosaic must be locally distortio...
Towards Fully Autonomous Visual Navigation
This thesis addresses some key issues which affect the level of autonomy inherent in visual navigation systems, with wider applicability in a range of fields. They can be divided into two areas. Firstly, automated initialisation, in which the kinematic and camera calibration parameters needed for an active camera platform are calculated without user interaction. Secondly, the complexity problem...
Journal
Journal title: Machines
Year: 2022
ISSN: 2075-1702
DOI: https://doi.org/10.3390/machines10050364